Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence

Authors

  • Liqi Wang
  • Moody T. Chu
  • Bo Yu
Abstract

Two cases are notable exceptions: tensors of order 2, namely matrices, always have best approximations of arbitrary low rank, and tensors of any order always have a best rank-one approximation. Apart from these, it is known that high-order tensors may fail to have best low rank approximations. When a condition of orthogonality is imposed, even under the modest assumption that only one set of components in the decomposed rank-one tensors is required to be mutually perpendicular, the situation changes completely: orthogonal low rank approximations always exist. The purpose of this paper is to discuss the best low rank approximation subject to orthogonality. The conventional high-order power method is modified to handle the orthogonality constraint via the polar decomposition. Techniques from algebraic geometry are employed to show that, for almost all tensors, the orthogonal alternating least squares method converges globally.
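To make the polar-decomposition idea concrete, here is a minimal NumPy sketch of one way such a semi-orthogonal ALS sweep can look; it is my illustration under stated assumptions, not the authors' code. The function name, update order, and defaults are mine. The a- and b-factors are updated by ordinary normalized least squares, while the whole block of c-factors is replaced at once by the orthogonal polar factor (computed from an SVD) of the matrix of contractions, which keeps the c-components mutually orthonormal at every step.

```python
import numpy as np

def semi_orthogonal_als(T, R, iters=100, seed=0):
    """Sketch of an ALS sweep for T ~ sum_r lam_r a_r (x) b_r (x) c_r
    with the c_r kept mutually orthonormal via the polar decomposition.
    (Illustrative only; the update order is an assumption.)"""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R)); A /= np.linalg.norm(A, axis=0)
    B = rng.standard_normal((J, R)); B /= np.linalg.norm(B, axis=0)
    C = np.linalg.qr(rng.standard_normal((K, R)))[0]  # orthonormal start
    for _ in range(iters):
        for r in range(R):
            # ordinary normalized least-squares updates for a_r, b_r
            a = np.einsum('ijk,j,k->i', T, B[:, r], C[:, r])
            A[:, r] = a / np.linalg.norm(a)
            b = np.einsum('ijk,i,k->j', T, A[:, r], C[:, r])
            B[:, r] = b / np.linalg.norm(b)
        # stack the contractions m_r = T(a_r, b_r, .) as columns of M ...
        M = np.einsum('ijk,ir,jr->kr', T, A, B)
        # ... and replace C by the orthogonal polar factor of M
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        C = U @ Vt
    # with C orthonormal, the optimal weights are lam_r = c_r^T m_r
    lam = np.einsum('kr,kr->r', M, C)
    return lam, A, B, C

# Usage on a small random tensor: orthonormality of C is preserved.
rng = np.random.default_rng(1)
T = rng.standard_normal((4, 4, 4))
lam, A, B, C = semi_orthogonal_als(T, R=2)
approx = np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)
```

Because the c-columns stay orthonormal, the cross terms in the residual cancel, and the fit error never exceeds the norm of the tensor itself.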


Similar Articles

On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors

Tensor decomposition has important applications in various disciplines, but it remains an extremely challenging task to this day. A slightly more manageable endeavor has been to find a low rank approximation in place of the full decomposition. Even for this less stringent undertaking, it is an established fact that tensors beyond matrices can fail to have best low rank approximations, with the...


Convergence of Alternating Least Squares Optimisation for Rank-One Approximation to High Order Tensors

The approximation of tensors has important applications in various disciplines, but it remains an extremely challenging task. It is well known that tensors of higher order can fail to have best low-rank approximations, with the important exception that best rank-one approximations always exist. The most popular approach to low-rank approximation is the alternating least squares (ALS) method...
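The rank-one ALS iteration mentioned above (also known as the higher-order power method) can be sketched in a few lines; this is a minimal NumPy illustration, and the function name and defaults are my assumptions. Each step fixes two unit factors and solves the least-squares problem for the third, which amounts to a tensor contraction followed by normalization.

```python
import numpy as np

def hopm_rank_one(T, iters=100, seed=0):
    """Higher-order power method / ALS sketch for a best rank-one
    approximation  T ~ lam * (a (x) b (x) c)  of a 3rd-order tensor."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    b = rng.standard_normal(J); b /= np.linalg.norm(b)
    c = rng.standard_normal(K); c /= np.linalg.norm(c)
    for _ in range(iters):
        # each update is a contraction against the other two factors
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    lam = np.einsum('ijk,i,j,k->', T, a, b, c)  # optimal scale for unit factors
    return lam, a, b, c

# For an exactly rank-one tensor, the iteration recovers it (up to sign).
u, v, w = np.array([1.0, 2.0]), np.array([0.0, 1.0, -1.0]), np.array([3.0, 4.0])
T = np.einsum('i,j,k->ijk', u, v, w)
lam, a, b, c = hopm_rank_one(T)
approx = lam * np.einsum('i,j,k->ijk', a, b, c)
```

For a generic starting guess the first sweep already aligns the factors with u, v, w up to sign, and the scale lam absorbs the signs and magnitudes.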


Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation

A local convergence theorem for calculating canonical low-rank tensor approximations (PARAFAC, CANDECOMP) by the alternating least squares algorithm is established. The main assumption is that the Hessian matrix of the problem is positive definite modulo the scaling indeterminacy. A discussion of whether this is realistic, together with numerical illustrations, is included. Regularization is also addressed.


Low-tubal-rank Tensor Completion using Alternating Minimization

The low-tubal-rank tensor model has been recently proposed for real-world multidimensional data. In this paper, we study the low-tubal-rank tensor completion problem, i.e., to recover a third-order tensor by observing a subset of its elements selected uniformly at random. We propose a fast iterative algorithm, called Tubal-AltMin, that is inspired by a similar approach for low-rank matrix compl...


Computing low-rank approximations of large-scale matrices with the Tensor Network randomized SVD

We propose a new algorithm for the computation of a singular value decomposition (SVD) low-rank approximation of a matrix in the Matrix Product Operator (MPO) format, also called the Tensor Train Matrix format. Our tensor network randomized SVD (TNrSVD) algorithm is an MPO implementation of the randomized SVD algorithm that is able to compute dominant singular values and their corresponding sin...



Journal:
  • SIAM J. Matrix Analysis Applications

Volume 36, Issue -

Pages -

Publication date: 2015